Attentive Language Models

Authors

  • Giancarlo Salton
  • Robert J. Ross
  • John D. Kelleher
Abstract

In this paper, we extend Recurrent Neural Network Language Models (RNN-LMs) with an attention mechanism. We show that an Attentive RNN-LM (with 14.5M parameters) achieves a better perplexity than larger RNN-LMs (with 66M parameters) and achieves performance comparable to an ensemble of 10 similarly sized RNN-LMs. We also show that an Attentive RNN-LM needs less contextual information to achieve results similar to the state-of-the-art on the wikitext2 dataset.
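The abstract does not specify the exact architecture, so the following is only a minimal sketch of the general idea: an RNN language model that attends over its own previous hidden states when predicting the next word. It assumes PyTorch; the class name AttentiveRNNLM, the hyperparameters, and the particular attention scoring scheme are illustrative and not taken from the paper.

```python
# Minimal sketch of an RNN language model with attention over its own
# previous hidden states (illustrative only; not the authors' exact model).
import torch
import torch.nn as nn

class AttentiveRNNLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn_score = nn.Linear(hidden_dim, hidden_dim, bias=False)
        # Next-word prediction uses the current state concatenated with the
        # attention context, hence 2 * hidden_dim input features.
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer token ids
        states, _ = self.rnn(self.embed(tokens))      # (batch, seq_len, hidden)
        batch, seq_len, hidden = states.shape
        logits = []
        for t in range(seq_len):
            query = states[:, t]                      # current hidden state
            keys = states[:, : t + 1]                 # states up to and including t
            # Bilinear attention scores over the history, then a softmax.
            scores = torch.bmm(keys, self.attn_score(query).unsqueeze(2))
            weights = torch.softmax(scores.squeeze(2), dim=1)
            context = torch.bmm(weights.unsqueeze(1), keys).squeeze(1)
            logits.append(self.out(torch.cat([query, context], dim=1)))
        return torch.stack(logits, dim=1)             # (batch, seq_len, vocab)
```

Trained with a standard cross-entropy loss over the returned logits, such a model adds only the small attention projection on top of the base RNN-LM, which is consistent with the paper's point that a 14.5M-parameter attentive model can compete with much larger plain RNN-LMs.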


Similar articles

CAESAR: Context Awareness Enabled Summary-Attentive Reader

Comprehending meaning from natural language is a primary objective of Natural Language Processing (NLP), and text comprehension is the cornerstone of this objective, upon which other problems such as chatbots and language translation can be built. We report a Summary-Attentive Reader we designed to better emulate the human reading process, along with a dictionary-based so...


Attentive Sequence-to-Sequence Learning for Diacritic Restoration of Yorùbá Language Text

Yorùbá is a widely spoken West African language with a writing system rich in tonal and orthographic diacritics. With very few exceptions, diacritics are omitted from electronic texts due to limited device and application support. Diacritics provide morphological information, are crucial for lexical disambiguation and pronunciation, and are vital for any Yorùbá text-to-speech (TTS), automatic spee...


Language-Based Image Editing with Recurrent Attentive Models

We investigate the problem of Language-Based Image Editing (LBIE) in this work. Given a source image and a natural language description, we want to generate a target image by editing the source image based on the description. We propose a generic modeling framework for two sub-tasks of LBIE: language-based image segmentation and image colorization. The framework uses recurrent attentive models ...


Irony Detection with Attentive Recurrent Neural Networks

Automatic Irony Detection refers to making computers understand the real intentions of humans behind ironic language. Much work has been done using classic machine learning techniques applied to various features. In contrast to sophisticated feature engineering, this paper investigates how deep learning can be applied to the intended task with the help of word embeddings. Three different d...


Attentive Interfaces for Multiple Monitors

The use of multiple monitors is becoming popular but it creates problems for pointer movement, window management, and control of applications. We suggest combining the standard input devices of keyboard and mouse with attentive user interface techniques. A user’s work with conventional input devices will naturally be based around a focal region, but that focus can be rapidly transferred in resp...




Publication date: 2017